Food videos have become a staple of social media. Picture-perfect sandwiches piled high with impossibly arranged fillings, huge vats of cheesy, creamy pasta, and slow-motion shots of oozing sauces to make Homer Simpson drool: all are only ever a click and a swipe away, and can be hugely profitable.
A video published on Saturday 1 June had all these hallmarks of social media success, then added another layer of virality.
“Rice is mixed with plastic bits to increase manufacturer profit!” read the subtitles as the camera zooms in on a frying pan in which tiny rice-like grains turn translucent with the heat.
“Ice cream that bubbles contains washing powder for shine and lightness,” reads another caption as a well-manicured hand squeezes lemon juice over the offending dessert.
In total, the video shows 16 of these “tests” for “fake” food, most featuring a side-by-side shot to give examples of which are “good” or “bad”. Another sees a well-manicured hand empty what the video purports to be baby food into a zip-lock bag. The bag is flattened and smoothed out before a series of fast cuts shows a magnet being pulled across the paste, dragging tiny black dots in its wake. “These are ground-up rocks advertised as fortified calcium!” screams the text.
Others allege that pure tea does not stain but “fake” tea will – tell that to any Brit who’s spilled a cuppa – or encourage viewers to set fire to their spices to see if they burn (“pure” spices will catch light, apparently).
By Sunday afternoon, the video had been viewed more than 40 million times on Facebook. By Thursday it was 87 million, with more than 500,000 reactions, 216,000 comments and over three million shares. By almost any metric it’s a viral smash hit.
Yet almost none of the claims made in the video have any scientific backing.
“A lot of the claims that are being made [would be] highly illegal in the United States and they would come with penalties,” says Pete Cassell, a spokesman at the US Food and Drug Administration (FDA). “Our food is inspected and monitored and if these kind of things were happening we would know about it and we would take action against a company that was doing something like this.”
The full, three-and-a-half-minute video was produced by First Media, a Los Angeles-based production house, and posted to the verified Facebook page of Blossom, its brand aimed at young women and mothers.
This is not some shady fake news fringe site, but an established company with its own TV channel and a viral marketing arm working for major brands like Tinder and Pepsi. In 2018 First Media was a finalist in the revered Shorty Awards, nominated for its Facebook presence in the “instructional video” category.
With more than 50 million followers, Blossom is the flagship Facebook page for First Media, driving 97 per cent of its interactions according to Facebook’s social media monitoring tool CrowdTangle.
The video was first posted to the personal Facebook profile of a First Media producer, who wrote “Researcher/Director/Producer/Cinematographer/Editor – ME!” followed by the grimace emoji. “All I wanted was to share awareness,” she added. “Impressive work, well done!” encouraged the company’s chief product officer in the comments.
The emotional share
“To me it seems like scaremongering,” says Emma Goodman, policy officer at the London School of Economics’ Trust, Truth and Technology Commission, when viewing the video for the first time.
“I’m unclear of why someone would put that together. Maybe for ad revenue? I don’t really see what an ulterior motive would be. Maybe aimed at mothers… As I know from my personal experience, when you become a mother you start freaking out about what’s in your food and what you’re feeding your children [so] it’s playing on people’s emotions and parents’ fears.”
The same is true of the anti-vax community, which preys on the fears of new parents to the point where the World Health Organisation reported epidemic levels of measles in Europe in 2018, with at least 37 deaths from a disease wholly preventable by vaccines. Disgraced doctor Andrew Wakefield first claimed vaccines were linked to autism in 1998 and, no matter how many times his theory is roundly and rigorously refuted by scientific researchers, online communities continue to spring up to share his claims, feeding off every parent’s desire to protect their children.
“The act of sharing is something you do based on emotion rather than anything else,” says Goodman. “When you feel very strongly about something you do want to share it and especially when it’s something you think people ought to know. And, when you are a young mother, you’re likely to have that community and think ‘I have to share this with people’”.
The FDA makes unannounced inspections at factories on a regular basis, Cassell says, but food standards agencies often rely on reports from consumers to know where and when to take action. Eroding trust in such institutions – a recurring theme of online discourse since at least 2016 – risks undermining their ability to do their job and protect the public.
He explains: “We want to make sure that when a consumer sees advice it has backing and is based on science and are the best recommendations that experts are giving. Not just some rumour from the internet. One of the cornerstones of FDA’s mission is to protect human health and to protect the food supply and to ensure consumers have confidence that the food they’re eating is safe. Issues like this tend to cause confusion among consumers.”
A plastic grain of truth
None of this is to say that the food industry is functioning perfectly, of course. The web is awash with secret footage filmed in meat packing plants or at factory farms, showing animal cruelty, health and safety violations and more. But those are not created in the central LA offices of a viral marketing studio.
“If [the claims in the video were an issue in the US] then the groups that I work with, people would be on to that,” says Thomas Gremillion, director of food policy at the Consumer Federation of America. “We would be talking to the regulatory agencies and there are standards setting bodies that do a pretty good job [of making sure] that food is what it’s supposed to be.”
“We are aware of this video but have seen no evidence of such inauthenticity in many of the products featured. While some of the claims are familiar to us, there is no evidence of a current risk to the public in the UK,” a spokesperson for the UK Food Standards Agency says.
This is because almost none of the claims featured in the video came from US or European sources.
When asked about the research that went into making the video, First Media provided a list of links for each clip. Eleven of the 16 were from a mix of Indian sources relating to alleged food scandals in the country, from the urbane Instagram influencers of New Delhi to national newspapers and rural TV channels. One, relating to red dye on sweet potatoes, came from the official food standards authority in India.
Karen Rebelo works for the Indian fact-checking site Boom Live, part of the International Fact Checking Network (IFCN), which in 2016 was approved by Facebook to shed light on spurious posts across the platform. She says she repeatedly debunks false claims and videos about viral food scares of the type presented in the Blossom video, even when they come from well-known news organisations in the country.
“We have seen Hindi and regional television news channels picking up viral videos from social media about food scares,” she says. “They amplify and legitimise such claims without showing any journalistic scepticism.”
In June 2017, BOOM FactCheck (@boomlive_in) tweeted: “Fact 1: There is no evidence of plastic rice, FSSAI says. Fact 2: India does not import rice from China.” https://t.co/2ArjRwh4Pq
Of the rest of the claims, one came from WikiHow, a crowdsourced database of how-to guides, and another was sourced to a paper in an academic journal from 1983. Two of the tests – the “rocks” in baby food and the claim that natural supplements won’t burn in the oven – appeared to be lifted entirely from one of the sources provided: a YouTube channel with dozens of videos but just 79 subscribers. While not as slick as Blossom’s version, the 2015 video from which the tests come is similarly light on detail or explanation.
Only one clip, about “meat glue”, had a list of authoritative sources which referenced a common practice in the food industry to bind old meat scraps using a regulator-approved enzyme. The result is controversial among consumers, yet processed cuts are sold by the bucket load in supermarkets and fast food outlets.
When asked directly whether the producer fabricated specific elements of the video to get the desired results – such as adding iron filings to baby food or washing powder to ice cream – First Media declined to comment. A spokesperson for the company said: “The video does not claim that all products or specific manufacturers include these materials, nor does it make any health or nutritional suggestions or recommendations. They are demonstrations of things we consider to be important for our global audience, however this content is intended only for informational purposes and as entertainment.”
Fact-checking’s limited scope
The claims were officially picked apart on Monday in an article by Maarten Schenk, who created social media monitoring tool Trendolizer and runs hoax-busting site Lead Stories, another member of the IFCN. The tests are “not adequately explained, making the results useless”, he wrote.
“I gave it a mixture rating because not everything was false,” he says. “Some of it had a grain of truth in it, even if it was a plastic grain.”
As part of the IFCN, his fact check now appears below the video as “additional reporting” or “related articles” on Facebook when it is watched by somebody in the UK. A similar Swedish debunk was added on Wednesday. Yet no such warning appears when the video is watched in Schenk’s native Belgium, he says.
But displaying such a warning would not be enough even if it were applied universally. A notification should have been sent to anyone who had already shared the video, Schenk says, yet the warning does not appear next to the video in Facebook users’ feeds or on Blossom’s Facebook page, where 97 per cent of the interactions have come from, according to CrowdTangle.
Back at the LSE, Goodman did not see the warning on mobile, although it appeared in the desktop view. “It’s not very clear that it’s a debunk,” she says. “It just says related article and then a link and I don’t feel like that’s really effective. If you’re not already suspicious I wouldn’t think that would make you go further [by clicking to find out more].”
This ineffectiveness is evident in the stats. By Wednesday afternoon Schenk’s article debunking the video had received roughly 80,000 views on Facebook, he says, compared to Blossom’s 80 million.
Elsewhere on the internet, the video is unaccompanied by any form of debunking. “As far as Twitter is concerned, this video is still being spread unrestricted, same on LinkedIn, same on YouTube,” Schenk adds of the clip, which has been chopped up and repurposed by the myriad copycat and viral vampire accounts that latch onto content and claim ownership in the relentless pursuit of clicks.
YouTube, meanwhile, has been taking a stronger stance in recent months, revoking certain channels’ ability to earn ad revenue when they are deemed to have broken the platform’s standards. But the deeper problem remains: the type of content that social networks are set up to reward.
An inherent problem?
When a video roundly dismissed by experts can outpace the truth by a ratio of 1,000 to 1, it naturally brings into question why the truth lags so far behind.
“One of the problems of fact checking is you never have the reach of the original,” says Goodman, who is among a growing number of experts pointing out that the very design of social media platforms may be the issue. Much of her recent work has been in making policy recommendations around social media regulation, and the only “reasonable, proportionate” response, she says, would be to add some kind of obvious warning. But the ongoing dilemma, beyond viral conspiracies over food scares, concerns the rights of content creators and their freedom of speech.
“I believe in people’s right to free expression but do you have a right to monetise your opinions if they are incredibly hateful or based on lies?” she says.
“Just sharing a video like this illustrates the weakness of social media,” agrees Gremillion. “What are you supposed to do with this video? Start lighting your food on fire and get a bunch of magnets in the kitchen? It seems like a waste of people’s time.”
Social networks may boost content that elicits a strong emotional response more than anything else “but there’s been quite a lot of discussion about that and whether social networks can change that so they’re not rewarding this very negative content in the same way,” says Goodman.
Emboldened by its barnstorming inquiry into “disinformation and fake news”, which made headlines with the Cambridge Analytica scandal, parliament’s Digital, Culture, Media and Sport select committee launched a new inquiry into “immersive and addictive technologies” last December.
Leading figures from YouTube and Instagram were grilled by the committee in mid-May about the addictive nature and knock-on effects of their platforms on users.
“There’s been a lot of discussion around design and how you need to build more ethical principles into the very design of social networks,” Goodman says. “Right from the start you have to have that built into how things are designed and you have to find a way to build products in such a way that they reward quality content.”
First Media believes its video, offering 16 “experiments” by which to test food, is quality content, and it has certainly been rewarded.
“Blossom’s research and production teams work very hard to ensure high quality and visually entertaining content that intrigues the natural curiosity and critical thinking of our audience and supplies value and ideas. Many of the examples presented in the video are, indeed, disturbing,” the spokesperson offered. “This video offers information from a variety of reputable and globally-recognized sources already available to the average consumer.”
“I think the fundamental problem is human nature,” says Goodman. “That’s really the issue here but we can’t really do much about that.”
This piece also appears on First Draft